Bert Base Japanese Basic Char V2
This is a Japanese BERT model pretrained with character-level tokenization and whole word masking; it can be used without the `fugashi` or `unidic_lite` packages.
Tags: Large Language Model, Transformers, Japanese
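A minimal usage sketch with Hugging Face Transformers is shown below. The repository id `hiroshi-matsuda-rit/bert-base-japanese-basic-char-v2` is an assumption based on the model name above; substitute the actual model id if it differs. Because the tokenizer is character-level, loading it does not pull in `fugashi` or `unidic_lite`.

```python
# Minimal masked-language-model sketch (assumed repo id; adjust as needed).
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "hiroshi-matsuda-rit/bert-base-japanese-basic-char-v2"  # assumed

# Character-level tokenizer: no MeCab/fugashi dependency is required.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Each Japanese character becomes (roughly) one token.
text = "東京は日本の首都です。"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Most likely token at every position according to the MLM head.
predicted_ids = outputs.logits.argmax(dim=-1)
print(tokenizer.convert_ids_to_tokens(predicted_ids[0]))
```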